Towards Neural Network-based Reasoning

arXiv: http://arxiv.org/abs/1508.05508

... The most interesting thing here was how they handled differently sized samples (as of now, I haven't read "Memory Networks" yet). Note that for fact reasoning we may have a different number of facts, each one a sentence of arbitrary length. They handled that by cascading DNNs, RNNs and/or pooling operations to end up with a fixed-size vector that could ultimately be classified or used as a seed for a final sequence-generating RNN. The stack also does "sensor fusion" between the query and the facts at each level.
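A minimal numpy sketch of that reduction step (all dimensions and parameter names here are made up for illustration, not taken from the paper): an RNN encodes each variable-length sentence into a fixed-size vector, pooling collapses the variable number of facts, and concatenation fuses the query with the pooled facts.

```python
import numpy as np

rng = np.random.default_rng(0)

EMBED, HIDDEN = 8, 16  # hypothetical sizes, not from the paper

# Shared toy parameters for a vanilla RNN sentence encoder.
Wx = rng.normal(scale=0.1, size=(EMBED, HIDDEN))
Wh = rng.normal(scale=0.1, size=(HIDDEN, HIDDEN))

def encode_sentence(tokens):
    """RNN over a variable-length sentence -> fixed HIDDEN-dim vector."""
    h = np.zeros(HIDDEN)
    for x in tokens:                      # tokens: (T, EMBED), T varies
        h = np.tanh(x @ Wx + h @ Wh)
    return h

def encode_facts(facts, query):
    """Encode each fact, pool across facts, fuse with the query."""
    fact_vecs = np.stack([encode_sentence(f) for f in facts])  # (N, HIDDEN)
    pooled = fact_vecs.mean(axis=0)       # pooling removes the N dimension
    q = encode_sentence(query)
    return np.concatenate([pooled, q])    # fixed 2*HIDDEN fused vector

# Variable numbers of facts, each with a different sentence length:
facts = [rng.normal(size=(t, EMBED)) for t in (3, 7, 5)]
query = rng.normal(size=(4, EMBED))
fused = encode_facts(facts, query)        # shape (32,) regardless of inputs
```

The point is that the output dimension depends only on `HIDDEN`, so a downstream classifier or generator RNN can be attached no matter how many facts arrive.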

Pointer Networks

arXiv: http://arxiv.org/abs/1506.03134

Variational Dropout and the Local Reparameterization Trick

arXiv: http://arxiv.org/abs/1506.02557

Winner-Take-All Autoencoders

arXiv: http://arxiv.org/abs/1409.2752

Inferring Algorithmic Patterns with Stack-Augmented Recurrent Nets

arXiv: http://arxiv.org/abs/1503.01007

Convolutional spike-triggered covariance analysis for neural subunit models

arXiv: no preprint
